Pointing Angle Estimation for Human-Robot Interface
Authors
Abstract
Similar resources
Human Pointing Navigation Interface for Mobile Robot with Spherical Vision System
Human-robot interaction requires an intuitive interface, which is not achievable with devices such as a joystick or teaching pendant that also require some training. Instruction by gesture is one example of an intuitive interface requiring no training, and pointing is one of the simplest gestures. We propose simple pointing recognition for a mobile robot having an upward-directed camera system...
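The abstract above does not spell out how a recognized pointing gesture becomes a navigation goal. As a minimal geometric sketch of one common approach, the snippet below intersects a head-to-hand pointing ray with the ground plane; the coordinate frame, the flat-floor assumption, and the function name are illustrative assumptions, not details from the paper.

```python
import numpy as np

def pointing_target_on_ground(head, hand, ground_z=0.0):
    """Intersect the head->hand ray with the horizontal plane z = ground_z.

    head, hand: 3D positions in a world frame with z pointing up
    (an assumption for this sketch; the paper uses its own camera geometry).
    Returns the (x, y) target on the floor, or None if the ray does not
    reach the plane in front of the person.
    """
    head = np.asarray(head, dtype=float)
    hand = np.asarray(hand, dtype=float)
    direction = hand - head          # pointing direction along the arm
    if abs(direction[2]) < 1e-9:     # ray parallel to the floor
        return None
    t = (ground_z - head[2]) / direction[2]
    if t <= 0:                       # intersection behind the person
        return None
    target = head + t * direction
    return target[:2]

# Example: head at 1.6 m, hand at 1.2 m, pointing forward and down.
print(pointing_target_on_ground([0.0, 0.0, 1.6], [0.3, 0.1, 1.2]))
```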
Robot Guidance by Human Pointing Gestures
In the search for intuitive man-machine interfaces, the visual recognition of human gestures would provide a powerful means to guide robot movements. In our paper we report on the development of the modular neural system "See-Eagle" for the visual guidance of robot pick-and-place actions. Several neural networks are integrated into a single system that visually recognizes human hand pointing gestures...
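The See-Eagle architecture itself is not detailed in the snippet above. Purely as an illustration of the underlying idea, a neural network mapping visual hand features to a discrete pointing target, here is a minimal scikit-learn sketch; the feature vectors, class labels, and network size are placeholders, not the paper's modules.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

# Placeholder training data: each row stands in for a feature vector extracted
# from a hand image region; labels are discrete pointing targets
# (0 = left bin, 1 = middle bin, 2 = right bin). Purely illustrative.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(300, 16))
y_train = rng.integers(0, 3, size=300)

# A small feed-forward network standing in for one module of a gesture system.
net = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0)
net.fit(X_train, y_train)

# At runtime, features from the current frame are classified into a target.
features = rng.normal(size=(1, 16))
print("predicted pointing target:", net.predict(features)[0])
```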
Visual recognition of pointing gestures for human-robot interaction
In this paper, we present an approach for recognizing pointing gestures in the context of human–robot interaction. In order to obtain input features for gesture recognition, we perform visual tracking of head, hands and head orientation. Given the images provided by a calibrated stereo camera, color and disparity information are integrated into a multi-hypothesis tracking framework in order to ...
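The multi-hypothesis tracker is beyond a short example, but the stereo part rests on the standard disparity-to-depth relation. The sketch below back-projects a pixel and its disparity to a 3D point for a rectified pin-hole stereo rig; the calibration values (focal length, principal point, baseline) are illustrative assumptions, not values from the paper.

```python
def disparity_to_point(u, v, d, f, cx, cy, baseline):
    """Back-project pixel (u, v) with disparity d (pixels) to a 3D point in
    the left-camera frame of a rectified stereo rig.

    f: focal length in pixels, (cx, cy): principal point, baseline: metres.
    These calibration values are assumptions for the sketch.
    """
    if d <= 0:
        raise ValueError("disparity must be positive")
    z = f * baseline / d          # depth from the standard stereo relation
    x = (u - cx) * z / f
    y = (v - cy) * z / f
    return x, y, z

# Example: a hand detection at pixel (400, 260) with a 25 px disparity.
print(disparity_to_point(400, 260, 25, f=700.0, cx=320.0, cy=240.0, baseline=0.12))
```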
Recognition of 3D-Pointing Gestures for Human-Robot-Interaction
We present a system capable of visually detecting pointing gestures performed by a person interacting with a robot. The 3D trajectories of the person’s head and hands are extracted from image sequences provided by a stereo camera. We use Hidden Markov Models trained on different phases of sample pointing gestures to detect the occurrence of pointing gestures. For the estimation of pointing direction...
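As a rough sketch of HMM-based detection on such trajectories, the snippet below fits a Gaussian HMM to per-frame feature sequences and thresholds the log-likelihood of a new segment. The feature encoding, the three-state topology, and the use of hmmlearn are assumptions made for illustration, not the configuration of the paper above.

```python
import numpy as np
from hmmlearn.hmm import GaussianHMM

# Illustrative training data: each sequence is a series of per-frame features
# derived from head/hand positions (e.g. hand height and head-hand distance).
rng = np.random.default_rng(0)
train_seqs = [rng.normal(size=(40, 2)) for _ in range(20)]
X = np.concatenate(train_seqs)
lengths = [len(s) for s in train_seqs]

# Three hidden states, loosely standing in for begin / hold / retract phases
# of a pointing gesture (an assumption for this sketch).
model = GaussianHMM(n_components=3, covariance_type="diag", n_iter=50, random_state=0)
model.fit(X, lengths)

# At runtime, a candidate segment is scored; a threshold on the per-frame
# log-likelihood decides whether it is accepted as a pointing gesture.
segment = rng.normal(size=(40, 2))
loglik = model.score(segment) / len(segment)
print("accepted as pointing gesture:", loglik > -3.0)
```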
A Gesture Interface for Human-Robot-Interaction
We present a person-independent gesture interface implemented on a real robot which allows the user to give simple commands, e.g., how to grasp an object and where to put it. The gesture analysis relies on real-time tracking of the user's hand and a refined analysis of the hand's shape in the presence of varying complex backgrounds.
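The paper's refined hand-shape analysis is not reproduced here. As a generic illustration of hand segmentation plus one simple shape cue, the OpenCV sketch below thresholds skin-like colors in HSV and measures the solidity of the largest contour; the color bounds and the solidity heuristic are assumptions, not the method described above.

```python
import cv2

def hand_shape_features(frame_bgr):
    """Segment skin-like pixels and return a crude shape cue for the largest blob.

    The HSV bounds below are a common rule of thumb, not values from the paper;
    a robust system would adapt them to lighting and background.
    """
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, (0, 40, 60), (25, 255, 255))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    hand = max(contours, key=cv2.contourArea)
    hull = cv2.convexHull(hand)
    area = cv2.contourArea(hand)
    hull_area = cv2.contourArea(hull)
    if hull_area == 0:
        return None
    # Solidity drops when fingers are extended, a crude open/closed-hand cue.
    return {"area": area, "solidity": area / hull_area}

# Example usage on a single frame (path is a placeholder):
# print(hand_shape_features(cv2.imread("frame.png")))
```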
Journal
Journal title: Journal of Advances in Computer Networks
Year: 2013
ISSN: 1793-8244
DOI: 10.7763/jacn.2013.v1.16